Navigation in Difficult Environments: Multi-Sensor Fusion Techniques

Authors

  • Andrey Soloviev
  • Mikel M. Miller
Abstract

This paper focuses on multi-sensor fusion for navigation in difficult environments, where none of the existing navigation technologies can satisfy the requirements for accurate and reliable navigation when used in a stand-alone mode. A generic multi-sensor fusion approach is presented. This approach builds the navigation mechanization around a self-contained inertial navigator, which is used as the core sensor. Other sensors generally derive navigation-related measurements from external signals, such as Global Navigation Satellite System (GNSS) signals and signals of opportunity (SOOP), or from external observations, for example, features extracted from laser-scanner and video-camera images. Depending on the specific navigation mission, these measurements may or may not be available. Therefore, externally dependent sources of navigation information (including GNSS, SOOP, laser scanners, video cameras, pseudolites, Doppler radars, etc.) are treated as secondary sensors. When available, measurements from a secondary sensor or sensors are used to reduce drift in the inertial navigation outputs, while inertial data are applied to improve the robustness of the secondary sensors' signal processing. Applications of the multi-sensor fusion approach are illustrated in detail for two case studies: 1) integration of the Global Positioning System (GPS), a laser scanner, and inertial navigation; and 2) fusion of laser-scanner, video-camera, and inertial measurements. Experimental and simulation results are presented to illustrate the performance of the multi-sensor fusion algorithms.

1.0 MOTIVATION

Many existing and prospective applications of navigation systems would benefit notably from the ability to navigate accurately and reliably in difficult environments. Examples of difficult navigation scenarios include urban canyons, indoor applications, and radio-frequency (RF) interference and jamming environments. In addition, different segments of a mission path can impose significantly different requirements on the navigation sensing technology and data processing algorithms. To exemplify, Figure 1 shows a mission scenario of an unmanned aerial vehicle (UAV).

Figure 1: UAV mission example.

For this example, the UAV is deployed in an open field; next, the vehicle enters an urban canyon to perform tasks such as surveillance and reconnaissance; and, finally, it returns to the deployment point. To enable operation of the UAV at any point on the flight path, a precision navigation, attitude, and time capability is required onboard the vehicle. The Global Navigation Satellite System (GNSS) generally provides satisfactory performance in open fields and suburban areas, but has fragmented availability in urban environments due to satellite blockage by buildings and other obstacles. Feature-based navigation techniques show promising potential in dense urban areas, where enough navigation-related features can be extracted from digital-camera images or laser scans. However, feature availability can be limited in relatively open areas. A self-contained inertial navigation system (INS) can provide a navigation solution in any environment; however, the solution accuracy drifts over time. In a stand-alone mode, none of the existing navigation technologies has the potential to satisfy the requirements for navigation accuracy, continuity, and availability over the entire duration of the UAV flight. Therefore, multi-sensor fusion techniques are pursued.
In other words, to be able to navigate in any environment at any time, it is beneficial to utilize any potential source of navigation-related information. Example applications that involve navigation in difficult environments include, but are not limited to: navigation, guidance, and control of unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs), as well as teams of UGVs and UAVs, for urban surveillance and reconnaissance tasks; geographical information system (GIS) data collection for mapping applications on open highways and in dense urban environments; indoor search and rescue; monitoring of urban infrastructure for situational awareness; and precise automotive applications such as automated lane keeping. Meter-level to decimeter-level reliable positioning capabilities are generally needed for these application examples. As stated previously, none of the existing navigation technologies can currently satisfy these requirements, or has the potential to provide these capabilities, in a stand-alone mode.

This paper discusses multi-sensor fusion approaches for navigation in difficult environments. A generic concept of multi-sensor fusion is first presented. Next, the paper exemplifies applications of the generic multi-sensor fusion concept for the development of specific multi-sensor mechanizations. Specifically, integrated Global Positioning System (GPS)/laser scanner/INS and laser scanner/video camera/INS mechanizations are considered.

2.0 MULTI-SENSOR FUSION APPROACH

The generic concept of multi-sensor navigation utilizes a self-contained inertial navigator as the core navigation sensor. The INS does not rely on any type of external information and, as a result, can operate in any environment. However, the inertial navigation solution drifts over time [1]. To mitigate INS drift, the core sensor is augmented by reference navigation data sources (such as, for example, GPS or a laser scanner). Reference data sources generally rely on external observations or signals that may or may not be available. Therefore, these sources are treated as secondary sensors. When available, secondary sensors' measurements are applied to reduce the drift in inertial navigation outputs, and inertial data are used to bridge over reference sensor outages. In addition, inertial data can be applied to improve the robustness of reference sensor signal processing: for instance, to significantly increase the GPS signal integration interval in order to enable processing of very weak GPS signals and to reduce the susceptibility of GPS to radio-frequency interference [2]. Figure 2 illustrates the multi-sensor fusion approach.

Figure 2: Generic multi-sensor fusion approach. The self-contained INS (core sensor) exchanges reference data and motion compensation with secondary sensors (GPS, laser scanner, Doppler radar); secondary sensors periodically "reset" inertial errors, while inertial data enable long signal integration for weak-signal processing.

Figure 3 provides a more detailed illustration of the interaction between the INS and a secondary navigation sensor: the secondary sensor's signal processing produces a reference navigation solution, which is compared with the core-sensor (INS) solution in a Kalman filter that estimates the inertial errors; the error estimates are then applied to correct the navigation outputs.
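The drift-mitigation loop of Figure 3 is, in essence, an error-state Kalman filter: the difference between the INS solution and the secondary sensor's reference solution observes the inertial errors, which are estimated and fed back. The following is a minimal sketch of this idea for a loosely coupled case in which the secondary sensor supplies position fixes; the one-dimensional error model, noise values, and class interface are illustrative assumptions, not the mechanization used in the paper.

    # Minimal error-state filter sketch for the loop in Figure 3 (assumptions:
    # loosely coupled integration, 1-D kinematics, position-fix measurements;
    # names and noise values are illustrative, not the paper's mechanization).
    import numpy as np

    class InertialErrorFilter:
        def __init__(self, dt, accel_noise=0.05, fix_noise=2.0):
            self.dt = dt
            self.x = np.zeros(2)                        # [position error, velocity error]
            self.P = np.diag([10.0, 1.0])               # initial error covariance
            self.F = np.array([[1.0, dt], [0.0, 1.0]])  # error-state transition
            self.Q = np.diag([0.0, (accel_noise * dt) ** 2])
            self.H = np.array([[1.0, 0.0]])             # a fix observes the position error
            self.R = np.array([[fix_noise ** 2]])

        def propagate(self):
            # Between reference fixes the INS runs freely; its error grows per F and Q.
            self.x = self.F @ self.x
            self.P = self.F @ self.P @ self.F.T + self.Q

        def update(self, ins_position, reference_position):
            # The INS-minus-reference difference is a direct observation of INS error.
            z = np.atleast_1d(ins_position - reference_position)
            y = z - self.H @ self.x                     # innovation
            S = self.H @ self.P @ self.H.T + self.R
            K = self.P @ self.H.T @ np.linalg.inv(S)
            self.x = self.x + K @ y
            self.P = (np.eye(2) - K @ self.H) @ self.P
            return self.x                               # fed back to correct the INS output

Between reference fixes only propagate() runs and the INS bridges the gap; whenever a fix is available, update() estimates the inertial errors, which are subtracted from the INS outputs, corresponding to the periodic "reset" of inertial errors shown in Figure 2.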
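Inertial aiding of GPS signal processing, the other benefit noted above, works by using the INS-predicted receiver motion to wipe off line-of-sight dynamics so that correlator outputs stay phase-coherent over an extended integration interval. Below is a minimal sketch under simplifying assumptions: 1-ms complex prompt correlator outputs and INS-predicted line-of-sight ranges are taken as inputs, a carrier-only model on GPS L1 is used, and all names are illustrative rather than taken from the paper.

    # Minimal sketch of inertial aiding for long coherent integration (assumptions:
    # 1-ms complex prompt correlator outputs and INS-predicted line-of-sight ranges
    # are available; carrier-only model on GPS L1; names are illustrative).
    import numpy as np

    SPEED_OF_LIGHT = 299_792_458.0                  # m/s
    L1_WAVELENGTH = SPEED_OF_LIGHT / 1575.42e6      # GPS L1 carrier wavelength, m

    def coherent_sum(prompt_outputs, ins_los_ranges):
        # Carrier phase predicted from INS-derived line-of-sight motion (radians).
        predicted_phase = 2.0 * np.pi * np.asarray(ins_los_ranges) / L1_WAVELENGTH
        # Counter-rotating each epoch removes platform dynamics from the signal,
        # so the residual phase stays nearly constant and the long sum remains
        # coherent, raising post-correlation SNR for very weak signals.
        return np.sum(np.asarray(prompt_outputs) * np.exp(-1j * predicted_phase))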



Publication date: 2010